Universal Machine Learning Interatomic Potentials are Ready for Solid Ion Conductors

Du, Hongwei, Hui, Jian, Zhang, Lanting, Wang, Hong

arXiv.org Artificial Intelligence

With the rapid development of energy storage technology, high-performance solid-state electrolytes (SSEs) have become critical for next-generation lithium-ion batteries. These materials require high ionic conductivity, excellent electrochemical stability, and good mechanical properties to meet the demands of electric vehicles and portable electronics. However, traditional methods like density functional theory (DFT) and empirical force fields face challenges such as high computational costs, poor scalability, and limited accuracy across material systems. Universal machine learning interatomic potentials (uMLIPs) offer a promising solution with their efficiency and near-DFT-level accuracy.

This study systematically evaluates six advanced uMLIP models (MatterSim, MACE, SevenNet, CHGNet, M3GNet, and ORBFF) in terms of energy, forces, thermodynamic properties, elastic moduli, and lithium-ion diffusion behavior. The results show that MatterSim outperforms the others in nearly all metrics, particularly in complex material systems, demonstrating superior accuracy and physical consistency. Other models exhibit significant deviations due to issues like energy inconsistency or insufficient training data coverage.

Further analysis reveals that MatterSim achieves excellent agreement with reference values in lithium-ion diffusivity calculations, especially at room temperature. Studies on Li3YCl6 and Li6PS5Cl uncover how crystal structure, anion disorder levels, and Na/Li arrangements influence ionic conductivity. Appropriate S/Cl disorder levels and optimized Na/Li arrangements enhance diffusion pathway connectivity, improving overall ionic transport performance.
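Diffusivity benchmarks like the ones described above are typically computed from molecular-dynamics trajectories via the Einstein relation, then extrapolated to room temperature with an Arrhenius fit (high-temperature MD is needed because Li hops are rare at 300 K). The paper does not publish its analysis code, so the sketch below is only an illustration of that standard workflow, using plain NumPy; the function names and the synthetic trajectory are my own, not from the study.

```python
import numpy as np

def diffusivity_from_msd(positions, dt, dim=3):
    """Tracer diffusivity D from mean-squared displacement (Einstein relation).

    positions : (n_frames, n_atoms, 3) unwrapped Li coordinates in Angstrom
    dt        : time between frames in ps
    Returns D in cm^2/s, using MSD(t) = 2 * dim * D * t.
    """
    disp = positions - positions[0]               # displacement from frame 0
    msd = (disp ** 2).sum(axis=2).mean(axis=1)    # (n_frames,), in A^2
    t = np.arange(len(msd)) * dt                  # time axis in ps
    slope = np.polyfit(t[1:], msd[1:], 1)[0]      # linear fit, A^2 / ps
    return slope / (2 * dim) * 1e-4               # 1 A^2/ps = 1e-4 cm^2/s

def arrhenius_extrapolate(temps, diffusivities, t_target=300.0):
    """Fit ln D = ln D0 - Ea / (kB * T) over high-T MD runs and
    extrapolate D to t_target (K). Returns (D_at_target, Ea_in_eV)."""
    kB = 8.617333e-5                              # Boltzmann constant, eV/K
    inv_t = 1.0 / np.asarray(temps, dtype=float)
    slope, intercept = np.polyfit(inv_t, np.log(diffusivities), 1)
    ea = -slope * kB                              # activation energy (eV)
    d_target = np.exp(intercept + slope / t_target)
    return d_target, ea
```

A typical use would be to run MD with each uMLIP at several elevated temperatures (e.g. 600–1000 K), compute `diffusivity_from_msd` per run, and compare the extrapolated 300 K value against an ab-initio or experimental reference.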


Research Bits: April 19

#artificialintelligence

Processor power prediction Researchers from Duke University, Arm Research, and Texas A&M University developed an AI method for predicting the power consumption of a processor, returning results more than a trillion times per second while consuming very little power itself. "This is an intensively studied problem that has traditionally relied on extra circuitry to address," said Zhiyao Xie, a PhD candidate at Duke. "But our approach runs directly on the microprocessor in the background, which opens many new opportunities. I think that's why people are excited about it." The approach, called APOLLO, uses an AI algorithm to identify and select just 100 of a processor's millions of signals that correlate most closely with its power consumption. It then builds a power consumption model from those 100 signals and monitors them to predict the entire chip's power consumption in real time.
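The two-step idea described above — prune millions of candidate signals down to ~100 power proxies, then fit a lightweight linear model over them — can be sketched in a few lines. This is only a simplified illustration: APOLLO's actual pruning uses a regularized regression rather than the plain correlation ranking shown here, and all function names and the toy data below are my own.

```python
import numpy as np

def select_power_proxies(signals, power, k=100):
    """Rank on-chip signals by absolute correlation of their per-cycle
    toggle activity with measured power, and keep the top k.

    signals : (n_cycles, n_signals) 0/1 toggle matrix
    power   : (n_cycles,) measured power per cycle
    Returns indices of the k most power-correlated signals.
    """
    x = signals - signals.mean(axis=0)
    y = power - power.mean()
    denom = np.sqrt((x ** 2).sum(axis=0) * (y ** 2).sum())
    corr = np.abs(x.T @ y) / np.where(denom == 0, 1.0, denom)
    return np.argsort(corr)[::-1][:k]

def fit_power_model(signals, power, idx):
    """Least-squares linear model: power ~ w . signals[idx] + bias."""
    X = np.column_stack([signals[:, idx], np.ones(len(power))])
    weights, *_ = np.linalg.lstsq(X, power, rcond=None)
    return weights

def predict_power(signals, idx, weights):
    """Per-cycle power prediction from the selected proxy signals."""
    X = np.column_stack([signals[:, idx], np.ones(len(signals))])
    return X @ weights
```

Because inference is just a dot product over ~100 bits per cycle, a model of this shape is cheap enough to evaluate on-chip alongside the workload, which is the property the article highlights.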


Harnessing Machine Learning to Accelerate Fast-Charging Battery Design

#artificialintelligence

According to a new study in the journal Nature Materials, researchers from Stanford University have harnessed machine learning to overturn long-held assumptions about the way lithium-ion batteries charge and discharge, providing engineers with a new list of criteria for making longer-lasting battery cells. This is the first time machine learning has been coupled with knowledge obtained from experiments and physics equations to uncover and describe how lithium-ion batteries degrade over their lifetime. Machine learning accelerates analyses by finding patterns in large amounts of data. In this instance, researchers taught the machine to study the physics of a battery failure mechanism to design superior and safer fast-charging battery packs. Fast charging can be stressful and harmful to lithium-ion batteries, and resolving this problem is vital to the fight against climate change.